Markov decision process — Markov decision processes (MDPs), named after Andrey Markov, provide a mathematical framework for modeling decision making in situations where outcomes are partly random and partly under the control of a decision maker. MDPs are useful for… … Wikipedia
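The entry above defines an MDP informally. As a minimal sketch of the kind of computation the framework supports, here is value iteration on a made-up two-state MDP; all state names, transition probabilities, rewards, and the discount factor below are illustrative assumptions, not from any particular source.

```python
# Hypothetical toy MDP: P[s][a] = list of (probability, next_state, reward).
P = {
    0: {"stay": [(1.0, 0, 0.0)], "go": [(0.8, 1, 5.0), (0.2, 0, 0.0)]},
    1: {"stay": [(1.0, 1, 1.0)], "go": [(1.0, 0, 0.0)]},
}
gamma = 0.9  # discount factor

# Value iteration: repeatedly apply the Bellman optimality update
# V(s) <- max_a sum_{s'} P(s'|s,a) * (r + gamma * V(s')).
V = {s: 0.0 for s in P}
for _ in range(200):
    V = {
        s: max(
            sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)
            for outcomes in P[s].values()
        )
        for s in P
    }
```

After enough iterations `V` approximates the optimal state values, from which a greedy policy can be read off.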
Partially observable Markov decision process — A Partially Observable Markov Decision Process (POMDP) is a generalization of a Markov Decision Process. A POMDP models an agent's decision process in which it is assumed that the system dynamics are determined by an MDP, but the agent cannot… … Wikipedia
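Because the agent cannot observe the state directly, it maintains a belief, i.e. a probability distribution over hidden states, updated by Bayes' rule after each action and observation. A minimal sketch follows; the two-state setup, the "listen" action, and the 0.85 observation accuracy are illustrative assumptions in the style of the classic tiger problem.

```python
def belief_update(belief, T, O, action, obs):
    """Bayes update: b'(s') is proportional to O(obs | s', a) * sum_s T(s' | s, a) * b(s)."""
    new_b = {}
    for s2 in belief:
        pred = sum(T[(s, action)].get(s2, 0.0) * belief[s] for s in belief)
        new_b[s2] = O[(s2, action)].get(obs, 0.0) * pred
    z = sum(new_b.values())  # normalizing constant
    return {s: v / z for s, v in new_b.items()}

# Two hidden states; the "listen" action leaves the state unchanged.
T = {("left", "listen"): {"left": 1.0},
     ("right", "listen"): {"right": 1.0}}
# The observation is correct with probability 0.85.
O = {("left", "listen"): {"hear-left": 0.85, "hear-right": 0.15},
     ("right", "listen"): {"hear-left": 0.15, "hear-right": 0.85}}

# Starting from a uniform belief, one "hear-left" observation
# shifts the belief toward the "left" state.
b = belief_update({"left": 0.5, "right": 0.5}, T, O, "listen", "hear-left")
```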
Markov model — In probability theory, a Markov model is a stochastic model that assumes the Markov property. Generally, this assumption enables reasoning and computation with the model that would otherwise be intractable. Contents 1 Introduction 2 Markov chain… … Wikipedia
Markov chain — A simple two state Markov chain. A Markov chain, named for Andrey Markov, is a mathematical system that undergoes transitions from one state to another among a finite or countable number of possible states. It is a random process characterized … Wikipedia
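A simple two-state chain like the one the entry mentions can be simulated in a few lines; the "sunny"/"rainy" labels and the transition probabilities below are illustrative assumptions.

```python
import random

# Hypothetical two-state Markov chain: each row lists (next_state, probability).
# The next state depends only on the current state -- the Markov property.
T = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state, rng):
    """Sample the next state from the current state's transition row."""
    states, probs = zip(*T[state])
    return rng.choices(states, probs)[0]

rng = random.Random(0)  # fixed seed for reproducibility
state = "sunny"
counts = {"sunny": 0, "rainy": 0}
for _ in range(100_000):
    state = step(state, rng)
    counts[state] += 1
```

Over a long run the empirical state frequencies approach the chain's stationary distribution (here 5/6 "sunny", 1/6 "rainy", solvable from the balance equation pi_sunny * 0.1 = pi_rainy * 0.5).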
Markov process — In probability theory and statistics, a Markov process, named after the Russian mathematician Andrey Markov, is a time-varying random phenomenon for which a specific property (the Markov property) holds. In a common description, a stochastic… … Wikipedia
Markov property — In probability theory and statistics, the term Markov property refers to the memoryless property of a stochastic process. It was named after the Russian mathematician Andrey Markov.[1] A stochastic process has the Markov property if the… … Wikipedia
Decision field theory — (DFT), is a dynamic cognitive approach to human decision making. It is a cognitive model that describes how people make decisions rather than a rational model that prescribes what people should do. It is also a dynamic model of decision making… … Wikipedia
Andrey Markov — For other people named Andrey Markov, see Andrey Markov (disambiguation). Andrey (Andrei) Andreyevich Markov, born June 14, 1856 … Wikipedia
Info-gap decision theory — is a non probabilistic decision theory that seeks to optimize robustness to failure – or opportuneness for windfall – under severe uncertainty,[1][2] in particular applying sensitivity analysis of the stability radius type[3] to perturbations in… … Wikipedia
Chaîne de Markov — Depending on the author, a Markov chain is in general a discrete-time Markov process, or a discrete-time Markov process with a discrete state space. In mathematics, a Markov process is a stochastic process… … Wikipédia en Français
Processus de Markov — In mathematics, a Markov process is a stochastic process possessing the Markov property. In such a process, the prediction of the future from the present is not made more precise by additional information concerning the… … Wikipédia en Français